Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization

Authors

  • Jun Suzuki
  • Masaaki Nagata
Abstract

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target vocabulary word in the encoder and to control the output words in the decoder based on that estimate. Our method shows a significant improvement over a strong RNN-based encoder-decoder baseline and achieves its best results on an abstractive summarization benchmark.
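
The idea can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch illustration, not the authors' implementation: the module names, the linear "frequency head", and the hard logit cut-off during greedy decoding are simplifying assumptions meant only to show how an encoder-side frequency estimate could suppress words that have already been generated up to their estimated bound.

```python
# Minimal sketch (assumed names and architecture, untrained): estimate a
# per-word upper-bound frequency from the encoder state, then mask out
# words in the decoder once they have been emitted that many times.
import torch
import torch.nn as nn


class FrequencyAwareSeq2Seq(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 128, hid_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRUCell(emb_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)
        # Hypothetical head: predicts, for every target word, an upper bound
        # on how many times it may appear in the output summary.
        self.freq_head = nn.Linear(hid_dim, vocab_size)

    @torch.no_grad()
    def greedy_decode(self, src_ids: torch.Tensor, max_len: int = 30, bos_id: int = 1):
        _, h = self.encoder(self.embed(src_ids))
        h_t = h[-1]                                            # (batch, hid_dim)
        freq_bound = torch.relu(self.freq_head(h_t)).round()   # estimated upper-bound count per word
        counts = torch.zeros_like(freq_bound)                  # how often each word was emitted so far
        y_t = torch.full((src_ids.size(0),), bos_id, dtype=torch.long)
        outputs = []
        for _ in range(max_len):
            h_t = self.decoder(self.embed(y_t), h_t)
            logits = self.out(h_t)
            # Cut off words whose generated count has reached the estimated bound.
            logits = logits.masked_fill(counts >= freq_bound, float("-inf"))
            y_t = logits.argmax(dim=-1)
            counts.scatter_add_(1, y_t.unsqueeze(1),
                                torch.ones_like(y_t, dtype=counts.dtype).unsqueeze(1))
            outputs.append(y_t)
        return torch.stack(outputs, dim=1)                     # (batch, max_len)
```

In this sketch the cut-off is a hard mask applied at decode time; in the paper the frequency estimation is learned jointly with the encoder-decoder rather than bolted on afterwards, so the sketch only conveys the control-flow idea, not the training procedure.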


Related articles

Abstractive Document Summarization with a Graph-Based Attentional Neural Model

Abstractive summarization is the ultimate goal of document summarization research, but it was previously less investigated due to the immaturity of text generation techniques. Recently, impressive progress has been made on abstractive sentence summarization using neural models. Unfortunately, attempts at abstractive document summarization are still in a primitive stage, and the evaluation results...


A Review on Abstractive Summarization Methods

Text summarization is the process of extracting salient information from a source text and presenting that information to the user in the form of a summary. It is very difficult for human beings to manually summarize large documents of text. Automatic abstractive summarization provides the required solution, but it is a challenging task because it requires deeper analysis of the text. In this paper,...


Opinosis: A Graph Based Approach to Abstractive Summarization of Highly Redundant Opinions

We present a novel graph-based summarization framework (Opinosis) that generates concise abstractive summaries of highly redundant opinions. Evaluation results on summarizing user reviews show that Opinosis summaries have better agreement with human summaries compared to the baseline extractive method. The summaries are readable, reasonably well formed, and informative enough to convey the m...


TL;DR: Improving Abstractive Summarization Using LSTMs

Traditionally, summarization has been approached through extractive methods; however, they have produced limited results. More recently, neural sequence-to-sequence models for abstractive text summarization have shown more promise, although the task still proves to be challenging. In this paper, we explore current state-of-the-art architectures and reimplement them from scratch. We begin with a ...


Neural Abstractive Text Summarization

Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing sentences from the original source, while still preserving its meaning and key contents. We address this issue by modeling the problem as sequence-to-sequence learning and exploiting Recurrent Neural Networks (RNNs). This work is a discussion about our ongoi...



Journal:

Volume   Issue

Pages   -

Publication date: 2017